
    Multiple Texts as a Limiting Factor in Online Learning: Quantifying (Dis-)similarities of Knowledge Networks across Languages

    We test the hypothesis that the extent to which one obtains information on a given topic through Wikipedia depends on the language in which it is consulted. Controlling for the size factor, we investigate this hypothesis for 25 subject areas. Since Wikipedia is a central part of the web-based information landscape, this indicates a language-related, linguistic bias. The article therefore deals with the question of whether or not Wikipedia exhibits this kind of linguistic relativity. From the perspective of educational science, the article develops a computational model of the information landscape from which multiple texts are drawn as typical input of web-based reading. For this purpose, it develops a hybrid model of intra- and intertextual similarity of different parts of the information landscape and tests this model on the example of 35 languages and the corresponding Wikipedias. In this way the article builds a bridge between reading research, educational science, Wikipedia research and computational linguistics. Comment: 40 pages, 13 figures, 5 tables
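The hybrid intra-/intertextual similarity model itself is not reproduced in this abstract; purely as an illustration of the simplest building block such cross-language comparisons rest on, here is a bag-of-words cosine similarity between two hypothetical article snippets (the snippets and the choice of measure are assumptions, not the paper's actual model):

```python
from collections import Counter
from math import sqrt

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity of simple bag-of-words term-frequency vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical snippets standing in for two language editions' articles on one topic
en = "phosphorus is an essential nutrient for all life forms"
de = "phosphorus is an essential element required by life"
score = cosine_similarity(en, de)
```

A real experiment would of course operate on full article texts per topic and language edition and control for edition size, as the article describes.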

    Investigation of enhanced biological phosphorus removal characteristics in İzmir Wastewater Treatment Plant

    Phosphorus (P) is an essential nutrient for all life forms. It is also one of the limited and non-renewable natural resources. Furthermore, treated wastewater containing high levels of P can cause serious eutrophication problems in receiving water bodies (Janssen et al., 2002). Removal of nutrients by biological methods is a cost-effective and environmentally sound alternative to chemical treatment of wastewater (Osee et al., 1997). İzmir Bay is one of the great natural bays of the Aegean Sea; its total surface area is 500 km² and its total water volume is 11.5 billion m³ (Kucuksezgin et al., 2005). To prevent the discharge of untreated wastewater into the bay, the İzmir WWTP was put into operation in early 2000. The plant was designed to treat both domestic and pre-treated industrial wastewater collected from the İzmir metropolitan area. Since previous scientific investigations had indicated that both the nitrogen (N) and P concentrations of the sea were at critical levels with respect to eutrophication, the plant was designed for the combined removal of carbon (C), N and P in the activated sludge process, following adequate physical treatment consisting of fine screens, aerated grit removal chambers and circular primary sedimentation tanks. The average design capacity of the plant is approximately 605,800 m³/d. The mechanism of EBPR is based on the selection of P-accumulating organisms (PAOs) in the activated sludge culture by exposing the microorganisms to anaerobic, anoxic and aerobic environments in sequence. The preferential selection of PAOs in the system is attributed to the ability of these microorganisms to convert energy from the storage of simple carbon forms (mainly volatile fatty acids). In these assimilative reactions, energy is derived from the hydrolysis of intracellular poly-P reserves (Comeau et al., 1986; Mino et al., 1998). 
The energy generated by P release in the anaerobic zone is used to transport volatile fatty acids (VFAs) from the bulk liquid into the cells of PAOs, where these substrates are stored intracellularly as polyhydroxyalkanoates (PHAs). In the subsequent aerobic and anoxic zones of the EBPR process, the stored PHA is utilized to generate the energy required for the reproduction of new cells, maintenance, and the restoration of depleted poly-P reserves, using electron acceptors in the form of either dissolved oxygen (DO) or nitrate (USEPA, 1987; Kuba et al., 1996; Lee et al., 2003; Panswad et al., 2007). The availability of readily biodegradable COD (rbCOD) in the anaerobic zone is among the essential considerations. It has been reported that at least 20 mg as acetic acid (Janssen et al., 2002; Abu-ghararah, 1991) or 50 mg as COD (Ekama and Marais, 1984) are required to remove 1 mg of P. According to previous scientific investigations, the EBPR process can be considered COD-limited when the COD/TP ratio is low (<20:1 for settled domestic sewage), whereas it is P-limited when the COD/TP ratio is high (Randall et al., 1992). While low COD/TP ratios can cause EBPR failures, very low effluent P concentrations are achievable at sufficient COD/TP ratios. Although EBPR systems are well established, several issues in the identification of the microbial community remain unclear. It has also been reported that although several methods have been developed to analyze the microbial structure of the EBPR process, several critical issues are still open (Panswad et al., 2007). Moreover, many scientific investigations in this field have been based on laboratory-scale studies. This study aims to investigate fundamental EBPR characteristics and identify microbial responses to variable organic loading rates in a large-scale EBPR process. Investigations were conducted at the İzmir WWTP, serving a population equivalent of 3.5 million, between 2006 and 2007. 
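The COD/TP thresholds cited above (Randall et al., 1992) can be expressed as a small classification helper. This is an illustrative sketch using only the 20:1 cutoff stated for settled domestic sewage; the text gives no explicit boundary for the P-limited regime, so everything at or above the cutoff is labelled P-limited here:

```python
def ebpr_regime(cod_mg_l: float, tp_mg_l: float, cutoff: float = 20.0) -> str:
    """Classify an EBPR influent as COD-limited or P-limited by its COD/TP ratio.

    The 20:1 cutoff is the value cited for settled domestic sewage
    (Randall et al., 1992); the single-cutoff simplification is ours.
    """
    if tp_mg_l <= 0:
        raise ValueError("total phosphorus must be positive")
    ratio = cod_mg_l / tp_mg_l
    return "COD-limited" if ratio < cutoff else "P-limited"

# e.g. settled sewage with 300 mg/L COD and 20 mg/L TP -> ratio 15:1 -> COD-limited
regime = ebpr_regime(300, 20)
```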
In order to evaluate the EBPR process accurately, influent and effluent wastewater were characterized for various forms of nutrients. Mass balances were performed around the anaerobic, anoxic and aerobic zones, considering all main and side streams. All data required for accurate evaluation of the EBPR process, covering environmental and operational variables such as pH, temperature, mixed liquor volatile suspended solids concentration, hydraulic retention times in the biological treatment units, sludge age, inflow rate, and return sludge and internal recirculation rates, were determined during the monitoring period. Batch-scale tests were also performed in parallel with the full-scale investigations to identify microbial responses. All experimental results were statistically analyzed and evaluated in light of the previously established theoretical background in the field. Keywords: Enhanced biological phosphorus removal, mass balance, phosphorus accumulating microorganisms, denitrification. Enhanced biological phosphorus removal (EBPR) has become a widely preferred process in wastewater treatment both in Turkey and in the countries of the European Union. In this study, the biochemical and microbiological changes occurring in a full-scale EBPR process in response to varying organic loading rates and environmental factors were examined. The basic principle of EBPR processes is to make bacterial species capable of storing phosphorus (PAOs) dominant in the process by holding the activated sludge successively in anaerobic, anoxic and aerobic environments. Within the scope of this study, mass balances were established around the anaerobic, anoxic and aerobic zones of a large-scale EBPR process, and the bacterial populations forming in the system and the nutrient removal rates were estimated with the aid of kinetic relationships. The field studies were supported by laboratory-scale experiments. The investigations showed that the mass fraction of PAOs in the system varied between 9% and 34%. 
Confirming the results of previous scientific studies, it was determined that PAOs can store excess phosphorus (P) amounting to 32% of their cell weight. In addition, it was found that as the mass fraction of PAOs in the activated sludge culture increases, not only the P removal efficiency but also the carbon (C) and nitrogen (N) removal rates can increase significantly. The fraction of these bacteria in the system was found to be closely related to the simple carbon forms present in the wastewater, in agreement with earlier scientific studies on this subject. When the PAO mass fraction in the system exceeded 30%, the anoxic and aerobic P removal rates reached 0.1 mg P (g VSS)⁻¹ min⁻¹, the denitrification rate reached 0.04 mg NO₃-N (g VSS)⁻¹ min⁻¹, and the volatile fatty acid (VFA) removal rate reached 0.5 mg VFA (g VSS)⁻¹ min⁻¹. Keywords: Enhanced biological phosphorus removal, mass balance, phosphorus accumulating bacteria, denitrification.

    SOCNET 2018 - Proceedings of the “Second International Workshop on Modeling, Analysis, and Management of Social Networks and Their Applications”

    Modeling, analysis, control, and management of complex social networks represent an important area of interdisciplinary research in an advanced digitalized world. In the last decade social networks have produced significant online applications which run on top of a modern Internet infrastructure and have been identified as a major driver of the fast-growing Internet traffic. The "Second International Workshop on Modeling, Analysis and Management of Social Networks and Their Applications" (SOCNET 2018), held at Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, on February 28, 2018, covered related research issues of social networks in the modern information society. The Proceedings of SOCNET 2018 highlight the topics of a tutorial on "Network Analysis in Python" complementing the workshop program, present an invited talk "From the Age of Emperors to the Age of Empathy", and summarize the contributions of eight reviewed papers. The covered topics ranged from theoretically oriented studies focusing on the structural inference of topic networks, the modeling of group dynamics, and the analysis of emergency response networks to application areas of social networks such as social media use in organizations and the impact of social network applications on the modern information society. The Proceedings of SOCNET 2018 may stimulate the readers' future research on monitoring, modeling, and analysis of social networks and encourage their development efforts regarding social network applications of the next generation.
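The tutorial "Network Analysis in Python" is only named, not reproduced, in the proceedings summary above. As a minimal, standard-library-only illustration of the kind of computation such a tutorial typically begins with (the edge list and actor names below are invented), here is degree centrality on a toy social network:

```python
from collections import defaultdict

# A toy undirected social network as an edge list (hypothetical actors)
edges = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"), ("carol", "dave")]

# Build an adjacency structure from the edge list
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: degree / (n - 1), the most basic actor-importance measure
n = len(adj)
centrality = {node: len(neigh) / (n - 1) for node, neigh in adj.items()}
```

In practice a tutorial would use a library such as NetworkX, which provides this and many richer measures out of the box.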

    Kredi kartlarının tüketici davranışı üzerine etkisi (Kocaeli Özdilek Alış-Veriş Merkezi örneği)

    The full text was made openly accessible in accordance with the law published in Official Gazette No. 30352 of 06.03.2018 and the directive of 18.06.2018 on the electronic collection, organization, and dissemination of graduate theses. ABSTRACT Keywords: credit card, need, consumption, consumer behavior. This study aims to establish what kind of effect credit cards have on consumer behavior. To that end, the Özdilek shopping center, located within the provincial boundaries of Kocaeli, was chosen as the site for data collection, and interviews were additionally conducted with customer representatives of various banks. Evaluation and interpretation of the collected data showed that credit cards are, to a degree, a factor causing uncontrolled spending, and that the great majority of users, whatever their social class, wish to own a credit card whether to use it or simply to carry it in their pocket. It was further established that credit cards are an important means of payment for meeting people's essential and urgent needs. Although it is said that people today spend, or are on the way to spending, most of their material and immaterial resources on consumption, it should not be overlooked that people at times need, and are obliged to use, credit cards to overcome the financial difficulties they occasionally fall into.

    Multi-document analysis : semantic analysis of large text corpora beyond topic modeling

    This thesis presented many methods addressing the main goal of automatic document analysis at the semantic level. To reach that goal, however, we first had to develop a solid foundation to complete the overall picture. Various methods and tools were therefore developed that cover different aspects of NLP, and their interplay made it possible to achieve our goal. Besides automatic document analysis, we attach great importance to the three principles of efficiency, applicability, and language independence, which made the developed tools ready for application. The size and language of the data to be analyzed is no longer an obstacle, at least for the languages supported by Wikipedia. A major contribution was made by TextImager, the framework responsible for the underlying architecture of several methods and for all text preprocessing. TextImager is designed as a multi-server, multi-instance cluster, enabling distributed processing of data; the cluster-management services UIMA-AS and UIMA-DUCC are used for this. In addition, TextImager's multi-service architecture allows arbitrary NLP tools to be integrated and executed together. TextImager also offers a web-based user interface with a range of interactive visualizations presenting the results of text analysis. The web interface requires no programming knowledge: simply selecting the NLP components and entering the text starts the analysis, which is then visualized, so that non-computer-scientists can work with these tools as well. We also demonstrated the integration of the statistical framework R into the functionality and architecture of TextImager. 
Here we used the OpenCPU API to deploy R packages on our own R server. This allowed R packages to be combined with TextImager's state-of-the-art NLP components: the functions of the R packages received information extracted by TextImager, which led to improved analyses. Furthermore, we integrated interactive visualizations to display the information derived from R. Some of the visualizations developed in TextImager stand out in particular and have found application in many areas. One example is PolyViz, an interactive visualization system for displaying multipartite graphs, which we illustrated with two different use cases. SemioGraph, a visualization technique for displaying multi-codal graphs, was also presented; its visual and interactive features were demonstrated with an application for visualizing word embeddings. We showed that different models can lead to completely different graphs, so SemioGraph can help in selecting word embeddings for specific NLP tasks. Inspired by all the text visualizations in TextImager, the idea for text2voronoi was born. Here we presented a novel approach to image-driven text classification based on a Voronoi diagram of linguistic features. This classification approach was applied to automatic patient diagnosis, and we showed that it can even outperform the traditional bag-of-words model. The approach also makes it possible to analyze the underlying features afterwards, taking a first step towards opening the black box. We applied text2voronoi to literary works and presented the resulting visualizations in a web-based interface (LitViz). 
There, we enable the comparison of Voronoi diagrams of different literary works and thus a visual comparison of the language styles of the underlying authors. With our competence in preprocessing and analyzing texts, we came one step closer to our goal of semantic document analysis. Next, we investigated the resolution of senses at the word level. Here we presented fastSense, a disambiguation framework that copes with large amounts of data. To achieve this, we created a disambiguation corpus based on Wikipedia's 221,965 disambiguation pages, which refer to 825,179 senses. This resulted in more than 50 million records requiring almost 50 GB of storage. We showed not only that fastSense can process such a large amount of data without difficulty, but also that it can keep up with its competitors and even outperform them on some NLP tasks. Now that we can assign senses to words, we have come another step closer to semantic document analysis: the more information we can extract from a text and its words, the more precisely we can analyze its content. We also presented a network-theoretical approach to modeling the semantics of large text networks, using the German Wikipedia as an example. For this purpose we developed an algorithm called text2ddc to model the thematic structure of a text, with the model based on an established classification scheme, the Dewey Decimal Classification. With this model we showed how to display, from a bird's-eye view, the salience and interlinking of topics manifested in millions of documents, creating a way to automatically visualize the thematic dynamics of document networks. 
However, the training and test data we had in that chapter consisted mainly of short text fragments. We therefore created DDC corpora by combining information from Wikidata, Wikipedia, and the Integrated Authority File (GND) maintained by the German National Library. In this way we were able not only to increase the amount of data but also to create datasets for many previously inaccessible languages. We optimized text2ddc to the point where it achieves an F-score of 87.4% for the 98 classes of the second DDC level; TextImager's preprocessing and fastSense's disambiguation had a major influence on this result. For each piece of text, text2ddc computes a probability distribution over the DDC classes. The classifier-induced semantic space of text2ddc was also used to improve further NLP methods, including text2wiki, a framework for automatic tagging according to the Wikipedia category system. Here, too, we have a classifier-induced semantic space, but this time based on the Wikipedia category system. A great advantage of this model is the precision and depth of the topics covered and the constantly evolving category system, which also fulfills the criteria of an open topic model. To demonstrate the advantages of text2wiki, we then used the topic vectors provided by text2wiki to improve text2ddc, so that the two systems can improve each other. The synergy between the methods created in this dissertation was decisive for the success of each individual method.
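text2ddc itself is described above only at a high level. The sketch below merely illustrates its output format, a probability distribution over DDC classes, by applying a softmax to hypothetical classifier scores; the class labels, scores, and the softmax step are illustrative assumptions, not the actual text2ddc model:

```python
from math import exp

# Hypothetical raw classifier scores over three DDC classes (labels invented)
scores = {"000 Computer science": 2.1, "300 Social sciences": 0.3, "500 Science": -0.5}

def softmax(raw: dict) -> dict:
    """Turn raw scores into a probability distribution (numerically stable)."""
    m = max(raw.values())
    exps = {k: exp(v - m) for k, v in raw.items()}
    z = sum(exps.values())
    return {k: v / z for k, v in exps.items()}

dist = softmax(scores)          # probabilities summing to 1 over the DDC classes
top = max(dist, key=dist.get)   # the predicted DDC class for this text
```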

    Yeni ekonomi'ye rekabetçi yaklaşım

    This study consists of four chapters. The first chapter is the introduction, which explains the research and writing method, the organization of the chapters, and the overall structure of the study. The second chapter examines the definition and content of the new economy. The content section covers technological (information technology) change, business process reengineering, integration, electronic commerce, electronic business, and the new ways of creating value in the new economy. These topics were chosen in order to examine the technical and managerial content of electronic commerce, its formation process, and the factors that gave rise to it. Limiting the scope of this chapter was difficult, because the new economy was extremely new and its content had not yet been clearly delineated; it was limited according to the impacts on competitiveness, and topics such as legal, security, and macroeconomic issues were therefore not included. The third chapter examines the effects of the new economy on the structures of enterprises. These effects fall into two categories: effects on the internal processes of enterprises, and effects on external processes such as relations with suppliers and customers; because of advanced integration capabilities, classifying the internal and external parts of the enterprise posed its own difficulty. The concluding chapter evaluates the changes examined in the third chapter against the competition criteria listed at the beginning of that chapter and draws a conclusion. Finding a generally accepted definition of the new economy proved to be the hardest part of this research, since in the sources consulted "new economy" was used synonymously with "e-commerce", which narrows the scope of the new economy (if it exists at all). Several prominent authors argued that the new economy does not really exist and that one can speak only of "the economy", not of new and old economies; these arguments could not be dismissed. In examining the effects of the new economy on the competitive power of enterprises, the competition criteria must be considered first: the criteria are examined, then the changes the new economy imposes on enterprises, and finally these changes are judged against the previously established criteria to reach a conclusion. After examining the effects of the variables of the new economy on the existing internal structures and external functions of enterprises, these effects can be evaluated against Porter's competition criteria listed earlier (Section 3.1). It should be kept in mind, however, that the enterprises discussed are new or established firms that produce physical goods or services and offer them to the market through online or physical distribution channels, even if they have adopted processes of the new economy. In short, virtual enterprises operating under the "dot-com" label, lacking clear and comprehensible business objectives and operating plans, merely maintaining a presence on the Internet and seeking revenue through secondary means such as advertising, are not included in this study.

    Industrial sludge remediation with photonic treatment using Ti-Ag nano-composite thin films: Persistent organic pollutant removal from sludge matrix

    Mechanically dewatered industrial sludge (MDIS) was treated using pure and silver-doped thin films (TFs) grown on quartz substrates. The TFs were prepared using a sol-gel dip-coating technique. The resulting films were annealed at 450 °C for 3 h and characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). Homogeneous mixtures of light in the UV-A (380 nm) and visible (450 nm) regions of the electromagnetic spectrum were used as the irradiation source. The results revealed that illumination at different wavelengths helps to generate well-separated e⁻/h⁺ pairs, resulting in a decrease in the recombination rate. An electron transfer chain model was also developed from the experimental results. The performance of the applied method was evaluated by observing variations in the sludge bound water content (SBWC), the volatile solids removal rate (VSR), and the energy fluxes consumed and generated through endergonic and exergonic reactions. After treatment, the SBWC was reduced from 65% ± 1% to 39% ± 1%, and the highest VSR was measured to be 27 ± 0.1 mg VSS cm⁻² h⁻¹. The consumed and recovered energy fluxes were 960 ± 151 and 412 ± 26 J g⁻¹ VS removed, respectively. The raw sludge polychlorinated biphenyl (Σ₁₅PCB) and polyaromatic hydrocarbon (Σ₁₆PAH) concentrations were 4356.82 ± 22 µg kg⁻¹ and 446.25 ± 4.8 µg kg⁻¹, respectively. The Σ₁₅PCB and Σ₁₆PAH concentrations in the treated sludge samples were 129.86 ± 22 µg kg⁻¹ and 34.85 ± 13 µg kg⁻¹, respectively. (C) 2014 Elsevier Ltd. All rights reserved. This work was funded by the Scientific and Technological Research Council of Turkey (TÜBİTAK, Project 111Y209) and the Çerkezköy Organized Industrial Zone Management.
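The pollutant removal efficiencies implied by the raw and treated concentrations reported above can be reproduced directly (mean values only, uncertainties ignored):

```python
def removal_efficiency(influent: float, effluent: float) -> float:
    """Percent removal computed from raw and treated concentrations."""
    if influent <= 0:
        raise ValueError("influent concentration must be positive")
    return 100.0 * (influent - effluent) / influent

# Mean concentrations from the abstract, in µg/kg
pcb_removal = removal_efficiency(4356.82, 129.86)  # Σ15PCB: raw -> treated
pah_removal = removal_efficiency(446.25, 34.85)    # Σ16PAH: raw -> treated
```

Both removals come out above 90%, consistent with the large concentration drops reported for the treated sludge samples.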

    A Review of Dehydration of Various Industrial Sludges

    Wastewater characteristics and the sludge generation potential of point-source categories are reviewed critically. Novel industry-specific sludge dewatering/drying solutions necessary to establish a sustainable model are examined through a detailed literature survey. Knowledge of sludge properties is one of the most critical issues in designing dewatering/drying equipment. This study focuses on industrial wastewater/sludge characterization. In addition, a comprehensive review of current drying models and technologies is presented. A summary of the results derived from a novel thin-film-based photonic sludge dewatering/drying study is outlined as an alternative approach for industrial sludge control. Sludge was dried in a tubular quartz reactor (TQR), the inner surface of which was coated with a TiO2 thin film. The TQR was irradiated with UV-A, UV-B, and UV-C lamps. The energy fluxes consumed and generated through endergonic and exergonic reactions driven by photolysis and photocatalysis were investigated. In addition, the variations in sludge dewatering/drying characteristics were examined and compared with conventional methods to evaluate the energy requirements. This work was funded by the Scientific and Technological Research Council of Turkey (TÜBİTAK, Project 111Y209) and the Çerkezköy Organized Industrial Management.